    Spectral mixture analysis of EELS spectrum-images

    Recent advances in detectors and computer science have enabled the acquisition and processing of multidimensional datasets, in particular in the field of spectral imaging. Benefiting from these new developments, Earth scientists try to recover the reflectance spectra of macroscopic materials (e.g., water, grass, mineral types) present in an observed scene and to estimate their respective proportions in each mixed pixel of the acquired image. This task is usually referred to as spectral mixture analysis or spectral unmixing (SU). SU aims at decomposing the measured pixel spectrum into a collection of constituent spectra, called endmembers, and a set of corresponding fractions (abundances) that indicate the proportion of each endmember present in the pixel. Similarly, when processing spectrum-images, microscopists usually try to map elemental, physical and chemical state information of a given material. This paper reports how an SU algorithm dedicated to remote sensing hyperspectral images can be successfully applied to analyze spectrum-images resulting from electron energy-loss spectroscopy (EELS). SU generally overcomes standard limitations inherent to other multivariate statistical analysis methods, such as principal component analysis (PCA) or independent component analysis (ICA), that have previously been used to analyze EELS maps. Indeed, ICA and PCA may perform poorly for linear spectral mixture analysis due to the strong dependence between the abundances of the different materials. One example is presented here to demonstrate the potential of this technique for EELS analysis. Comment: Manuscript accepted for publication in Ultramicroscopy.
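
    The SU idea summarized above rests on the linear mixing model. A minimal sketch of that model's abundance-estimation step is given below; it is not the unmixing algorithm of the paper, it assumes the endmember spectra are already known, and the names unmix_pixel, E and delta are illustrative. The sum-to-one constraint is handled with the classic row-augmentation trick inside non-negative least squares.

    ```python
    # Minimal illustration of linear spectral unmixing (not the paper's algorithm):
    # a pixel spectrum y is modeled as y ≈ E @ a with a >= 0 and sum(a) = 1,
    # where the columns of E are assumed, already-known endmember spectra.
    import numpy as np
    from scipy.optimize import nnls

    def unmix_pixel(y, E, delta=1e3):
        """Fully constrained least squares: a heavily weighted extra row softly
        enforces the sum-to-one constraint inside the non-negative solver."""
        R = E.shape[1]                                   # number of endmembers
        E_aug = np.vstack([E, delta * np.ones((1, R))])
        y_aug = np.concatenate([y, [delta]])
        a, _ = nnls(E_aug, y_aug)                        # non-negative abundances
        return a / a.sum()                               # renormalize onto the simplex

    # Toy example: two Gaussian-shaped "endmembers" mixed 70/30 plus noise.
    bands = np.linspace(0.0, 1.0, 100)
    E = np.stack([np.exp(-(bands - 0.3) ** 2 / 0.01),
                  np.exp(-(bands - 0.7) ** 2 / 0.02)], axis=1)
    y = E @ np.array([0.7, 0.3]) + 0.01 * np.random.randn(100)
    print(unmix_pixel(y, E))                             # roughly [0.7, 0.3]
    ```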

    Joint segmentation of wind speed and direction using a hierarchical model

    The problem of detecting changes in wind speed and direction is considered. Bayesian priors, with various degrees of certainty, are used to represent relationships between the two time series. Segmentation is then conducted using a hierarchical Bayesian model that accounts for correlations between the wind speed and direction. A Gibbs sampling strategy overcomes the computational complexity of the hierarchical model and is used to estimate the unknown parameters and hyperparameters. Extensions to other statistical models are also discussed. These models allow us to study other joint segmentation problems, including segmentation of wave amplitude and direction. The performance of the proposed algorithms is illustrated with results obtained on synthetic and real data.
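
    As a deliberately simplified illustration of the joint idea, the sketch below detects a single change point shared by two series (a speed-like and a direction-like signal) by maximizing their combined Gaussian likelihood over candidate positions by exhaustive search; this is not the paper's hierarchical Gibbs sampler, and all names and settings here are illustrative.

    ```python
    # Exhaustive search for one change point common to two synthetic series;
    # the score is the sum of per-segment Gaussian log-likelihoods of both series.
    import numpy as np

    rng = np.random.default_rng(6)
    n, tau_true = 200, 120
    speed = np.concatenate([rng.normal(5.0, 1.0, tau_true), rng.normal(8.0, 1.0, n - tau_true)])
    direction = np.concatenate([rng.normal(0.5, 0.2, tau_true), rng.normal(1.4, 0.2, n - tau_true)])

    def seg_score(x, t):
        # Negative within-segment sum of squares (unit-variance Gaussian log-likelihood
        # up to constants), with each segment centred on its own mean.
        return sum(-0.5 * ((seg - seg.mean()) ** 2).sum() for seg in (x[:t], x[t:]))

    scores = [seg_score(speed, t) + seg_score(direction, t) for t in range(2, n - 1)]
    print("estimated shared change point:", 2 + int(np.argmax(scores)))  # close to 120
    ```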

    Bayesian orthogonal component analysis for sparse representation

    This paper addresses the problem of identifying a lower dimensional space where observed data can be sparsely represented. This under-complete dictionary learning task can be formulated as a blind separation problem of sparse sources linearly mixed with an unknown orthogonal mixing matrix. This issue is formulated in a Bayesian framework. First, the unknown sparse sources are modeled as Bernoulli-Gaussian processes. To promote sparsity, a weighted mixture of an atom at zero and a Gaussian distribution is proposed as the prior distribution for the unobserved sources. A non-informative prior distribution defined on an appropriate Stiefel manifold is selected for the mixing matrix. The Bayesian inference on the unknown parameters is conducted using a Markov chain Monte Carlo (MCMC) method. A partially collapsed Gibbs sampler is designed to generate samples asymptotically distributed according to the joint posterior distribution of the unknown model parameters and hyperparameters. These samples are then used to approximate the joint maximum a posteriori estimator of the sources and mixing matrix. Simulations conducted on synthetic data are reported to illustrate the performance of the method for recovering sparse representations. An application to sparse coding on an under-complete dictionary is finally investigated. Comment: Revised version. Accepted to IEEE Trans. Signal Processing.
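
    To make the model described above concrete, the short sketch below simply generates data under it: sparse Bernoulli-Gaussian sources mixed by a random orthogonal matrix. It covers only the generative side (in the square-mixing case), not the partially collapsed Gibbs sampler of the paper, and all variable names and parameter values are illustrative.

    ```python
    # Generative model: Bernoulli-Gaussian (spike at zero + Gaussian) sources,
    # mixed by an orthogonal matrix drawn uniformly at random.
    import numpy as np
    from scipy.stats import ortho_group

    rng = np.random.default_rng(0)
    n_sources, n_samples = 8, 500
    p_active, sigma_source, sigma_noise = 0.1, 1.0, 0.01

    # Each source coefficient is exactly zero with probability 1 - p_active,
    # otherwise Gaussian -- the "atom at zero" mixture that promotes sparsity.
    active = rng.random((n_sources, n_samples)) < p_active
    sources = active * rng.normal(0.0, sigma_source, (n_sources, n_samples))

    H = ortho_group.rvs(n_sources, random_state=0)    # orthogonal mixing matrix
    observations = H @ sources + sigma_noise * rng.normal(size=(n_sources, n_samples))
    ```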

    Estimating the number of endmembers in hyperspectral images using the normal compositional model and a hierarchical Bayesian algorithm.

    This paper studies a semi-supervised Bayesian unmixing algorithm for hyperspectral images. This algorithm is based on the normal compositional model recently introduced by Eismann and Stein. The normal compositional model assumes that each pixel of the image is modeled as a linear combination of an unknown number of pure materials, called endmembers. However, contrary to the classical linear mixing model, these endmembers are assumed to be random in order to model uncertainties regarding their knowledge. This paper proposes to estimate the mixture coefficients of the normal compositional model (referred to as abundances) as well as the number of endmembers using a reversible jump Bayesian algorithm. The performance of the proposed methodology is evaluated through simulations conducted on synthetic and real AVIRIS images.
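
    The generative assumption of the normal compositional model can be written in a few lines, as sketched below: each pixel is a convex combination of random (Gaussian) endmember realizations rather than of fixed spectra. This is only the data model, not the reversible jump sampler, and the dimensions, means and variance used here are illustrative.

    ```python
    # Normal compositional model, generative side: random endmembers, abundances
    # drawn on the simplex, one fresh endmember realization per pixel.
    import numpy as np

    rng = np.random.default_rng(1)
    n_bands, n_endmembers, n_pixels = 50, 3, 200
    endmember_means = rng.random((n_endmembers, n_bands))   # assumed mean spectra
    endmember_var = 1e-3                                     # endmember variability

    abundances = rng.dirichlet(np.ones(n_endmembers), size=n_pixels)  # non-negative, sum to one
    pixels = np.empty((n_pixels, n_bands))
    for p in range(n_pixels):
        # Each pixel sees its own random realization of every endmember.
        draws = endmember_means + np.sqrt(endmember_var) * rng.normal(size=(n_endmembers, n_bands))
        pixels[p] = abundances[p] @ draws
    ```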

    Enhancing hyperspectral image unmixing with spatial correlations

    This paper describes a new algorithm for hyperspectral image unmixing. Most of the unmixing algorithms proposed in the literature do not take into account the possible spatial correlations between the pixels. In this work, a Bayesian model is introduced to exploit these correlations. The image to be unmixed is assumed to be partitioned into regions (or classes) where the statistical properties of the abundance coefficients are homogeneous. A Markov random field is then proposed to model the spatial dependency of the pixels within any class. Conditionally upon a given class, each pixel is modeled by using the classical linear mixing model with additive white Gaussian noise. This strategy is investigated for the well-known linear mixing model. For this model, the posterior distributions of the unknown parameters and hyperparameters allow one to infer the parameters of interest. These parameters include the abundances for each pixel, the means and variances of the abundances for each class, as well as a classification map indicating the classes of all pixels in the image. To overcome the complexity of the posterior distribution of interest, we consider Markov chain Monte Carlo methods that generate samples distributed according to the posterior of interest. The generated samples are then used for parameter and hyperparameter estimation. The accuracy of the proposed algorithms is illustrated on synthetic and real data. Comment: Manuscript accepted for publication in IEEE Trans. Geoscience and Remote Sensing.
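
    A minimal sketch of the kind of Markov random field prior involved is given below: a Potts model over class labels, simulated with Gibbs sweeps, so that neighbouring pixels tend to share a class. A Potts model is a common choice for such label priors, but the sketch is not the paper's full sampler (which also updates abundances and hyperparameters), and the function and parameter names are illustrative.

    ```python
    # Gibbs sweeps over a Potts label field: each pixel is resampled conditionally
    # on its 4-neighbours, with interaction strength beta controlling smoothness.
    import numpy as np

    def potts_gibbs_sweeps(shape, n_classes, beta, n_sweeps=50, seed=0):
        rng = np.random.default_rng(seed)
        labels = rng.integers(n_classes, size=shape)
        rows, cols = shape
        for _ in range(n_sweeps):
            for i in range(rows):
                for j in range(cols):
                    counts = np.zeros(n_classes)          # neighbours carrying each class
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < rows and 0 <= nj < cols:
                            counts[labels[ni, nj]] += 1
                    probs = np.exp(beta * counts)
                    labels[i, j] = rng.choice(n_classes, p=probs / probs.sum())
        return labels

    label_map = potts_gibbs_sweeps((32, 32), n_classes=3, beta=1.2)  # spatially smooth regions
    ```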

    CS Decomposition Based Bayesian Subspace Estimation

    In numerous applications, it is required to estimate the principal subspace of the data, possibly from a very limited number of samples. Additionally, it often occurs that some rough knowledge about this subspace is available and could be used to improve estimation accuracy. This is the problem we address herein and, in order to solve it, a Bayesian approach is proposed. The main idea consists of using the CS decomposition of the semi-orthogonal matrix whose columns span the subspace of interest. This parametrization is intuitively appealing and allows for non-informative prior distributions of the matrices involved in the CS decomposition and very mild assumptions about the angles between the actual subspace and the prior subspace. The posterior distributions are derived and a Gibbs sampling scheme is presented to obtain the minimum mean-square distance estimator of the subspace of interest. Numerical simulations and an application to real hyperspectral data assess the validity and the performance of the estimator.
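
    The prior acts on the principal angles between the prior subspace and the subspace of interest, which are exactly what the CS decomposition exposes; the sketch below simply computes those angles for two nearby subspaces. It is an illustration of that geometric quantity, not the Gibbs sampler, and the dimensions and noise level are illustrative.

    ```python
    # Principal angles between a true subspace and a rough prior subspace.
    import numpy as np
    from scipy.linalg import orth, subspace_angles

    rng = np.random.default_rng(2)
    dim, rank = 20, 3

    U_true = orth(rng.normal(size=(dim, rank)))                    # true orthonormal basis
    U_prior = orth(U_true + 0.2 * rng.normal(size=(dim, rank)))    # rough prior knowledge

    angles = subspace_angles(U_true, U_prior)                      # radians; small if the prior is good
    print(np.degrees(angles))
    ```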

    Joint segmentation of multivariate astronomical time series: Bayesian sampling with a hierarchical model

    Astronomy and other sciences often face the problem of detecting and characterizing structure in two or more related time series. This paper approaches such problems using Bayesian priors to represent relationships between signals with various degrees of certainty, rather than rigid constraints. The segmentation is conducted using a hierarchical Bayesian approach to a piecewise constant Poisson rate model. A Gibbs sampling strategy allows joint estimation of the unknown parameters and hyperparameters. Results obtained with synthetic and real photon counting data illustrate the performance of the proposed algorithm.
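
    A stripped-down, single-change-point version of a piecewise constant Poisson rate model, with a small Gibbs sampler alternating between the two rates and the change-point position, is sketched below. The paper handles multiple change points and several time series jointly; the priors, seeds and variable names here are illustrative.

    ```python
    # Gibbs sampler for one change point in Poisson counts with Gamma(a, b) priors
    # on the two rates (conjugate updates for the rates, discrete update for tau).
    import numpy as np

    rng = np.random.default_rng(3)
    counts = np.concatenate([rng.poisson(2.0, 60), rng.poisson(6.0, 40)])  # synthetic photon counts
    n = counts.size
    a, b = 1.0, 1.0                                # Gamma prior (shape, rate) on both rates

    tau, taus = n // 2, []
    for it in range(2000):
        # Rates given the change point: Gamma posteriors (numpy's gamma takes a scale, not a rate).
        lam1 = rng.gamma(a + counts[:tau].sum(), 1.0 / (b + tau))
        lam2 = rng.gamma(a + counts[tau:].sum(), 1.0 / (b + n - tau))
        # Change point given the rates: categorical distribution over all admissible positions.
        log_post = np.array([counts[:t].sum() * np.log(lam1) - t * lam1
                             + counts[t:].sum() * np.log(lam2) - (n - t) * lam2
                             for t in range(1, n)])
        probs = np.exp(log_post - log_post.max())
        tau = 1 + rng.choice(n - 1, p=probs / probs.sum())
        if it >= 500:                              # discard burn-in samples
            taus.append(tau)
    print("posterior mean change-point position:", np.mean(taus))   # close to 60
    ```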

    Minimum mean square distance estimation of a subspace

    We consider the problem of subspace estimation in a Bayesian setting. Since we are operating in the Grassmann manifold, the usual approach which consists of minimizing the mean square error (MSE) between the true subspace $U$ and its estimate $\hat{U}$ may not be adequate as the MSE is not the natural metric in the Grassmann manifold. As an alternative, we propose to carry out subspace estimation by minimizing the mean square distance (MSD) between $U$ and its estimate, where the considered distance is a natural metric in the Grassmann manifold, viz. the distance between the projection matrices. We show that the resulting estimator is no longer the posterior mean of $U$ but entails computing the principal eigenvectors of the posterior mean of $U U^{T}$. Derivation of the MMSD estimator is carried out in a few illustrative examples, including a linear Gaussian model for the data and a Bingham or von Mises-Fisher prior distribution for $U$. In all scenarios, posterior distributions are derived and the MMSD estimator is obtained either analytically or implemented via a Markov chain Monte Carlo simulation method. The method is shown to provide accurate estimates even when the number of samples is lower than the dimension of $U$. An application to hyperspectral imagery is finally investigated.
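
    The computational step highlighted above, forming the posterior mean of $U U^{T}$ and taking its principal eigenvectors, is short enough to sketch directly; the posterior samples below are synthetic stand-ins for MCMC output, and all names and sizes are illustrative.

    ```python
    # MMSD-style estimate from posterior samples of a subspace basis U:
    # average the matrices U @ U.T, then keep the leading eigenvectors.
    import numpy as np
    from scipy.linalg import orth, eigh

    rng = np.random.default_rng(4)
    dim, rank, n_mcmc = 20, 3, 1000

    # Stand-in for MCMC output: orthonormal bases scattered around a true subspace.
    U_true = orth(rng.normal(size=(dim, rank)))
    samples = [orth(U_true + 0.3 * rng.normal(size=(dim, rank))) for _ in range(n_mcmc)]

    M = sum(U @ U.T for U in samples) / n_mcmc     # posterior mean of U U^T
    eigvals, eigvecs = eigh(M)                     # eigenvalues in ascending order
    U_mmsd = eigvecs[:, -rank:]                    # principal eigenvectors span the estimate
    ```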

    Hyperspectral image unmixing using a multiresolution sticky HDP

    This paper is concerned with joint Bayesian endmember extraction and linear unmixing of hyperspectral images using a spatial prior on the abundance vectors. We propose a generative model for hyperspectral images in which the abundances are sampled from a Dirichlet distribution (DD) mixture model, whose parameters depend on a latent label process. The label process is then used to enforce a spatial prior which encourages adjacent pixels to have the same label. A Gibbs sampling framework is used to generate samples from the posterior distributions of the abundances and the parameters of the DD mixture model. The spatial prior that is used is a tree-structured sticky hierarchical Dirichlet process (SHDP) and, when used to determine the posterior endmember and abundance distributions, results in a new unmixing algorithm called spatially constrained unmixing (SCU). The directed Markov model facilitates the use of scale-recursive estimation algorithms, and is therefore more computationally efficient than standard Markov random field (MRF) models. Furthermore, the proposed SCU algorithm estimates the number of regions in the image in an unsupervised fashion. The effectiveness of the proposed SCU algorithm is illustrated using synthetic and real data.
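
    A tiny illustration of how Dirichlet process style priors such as the SHDP allow the number of regions to be estimated rather than fixed is sketched below: the stick-breaking construction yields many potential components, of which only a few are actually occupied by the data. It is not the SCU algorithm, and the concentration parameter and truncation level are illustrative.

    ```python
    # Truncated stick-breaking draw from a Dirichlet process and the number of
    # components ("regions") that 1000 label draws actually occupy.
    import numpy as np

    rng = np.random.default_rng(5)
    alpha, n_atoms = 1.0, 25                      # concentration, truncation level

    betas = rng.beta(1.0, alpha, size=n_atoms)    # stick-breaking proportions
    weights = betas * np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    labels = rng.choice(n_atoms, size=1000, p=weights / weights.sum())
    print("components actually used:", np.unique(labels).size)
    ```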